SHE: A Fast and Accurate Deep Neural Network for Encrypted Data

Neural Information Processing Systems

Homomorphic Encryption (HE) is one of the most promising security solutions for emerging Machine Learning as a Service (MLaaS). Several Leveled-HE (LHE)-enabled Convolutional Neural Networks (LHECNNs) have been proposed to implement MLaaS while avoiding the large bootstrapping overhead. However, prior LHECNNs pay a significant computational overhead yet achieve only low inference accuracy, because they replace activations and poolings with polynomial approximations. Stacking many polynomial-approximation activation layers in a network greatly reduces inference accuracy, since the approximation errors cause a large distortion of the output distribution of the next batch normalization layer. The polynomial approximations of activations and poolings have therefore become the obstacle to a fast and accurate LHECNN model.
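The accuracy loss from stacked polynomial activations can be illustrated with a toy experiment. The sketch below is not from the paper: it uses the degree-2 square activation (as in CryptoNets-style LHECNNs) as an illustrative polynomial stand-in for ReLU, and all names are hypothetical.

```python
# Illustrative sketch: a polynomial stand-in for ReLU (the square
# activation used by some LHE-based CNNs) diverges from true ReLU,
# and the gap compounds when such layers are stacked.
import random

def relu(x):
    return max(0.0, x)

def poly_act(x):
    # Degree-2 polynomial approximation of an activation (x^2).
    return x * x

random.seed(0)
xs = [random.gauss(0.0, 1.0) for _ in range(10000)]

# One layer: mean absolute gap between ReLU and its polynomial stand-in.
gap1 = sum(abs(relu(x) - poly_act(x)) for x in xs) / len(xs)

# Two stacked layers: the approximation error compounds, distorting the
# distribution seen by subsequent layers (e.g., batch normalization).
gap2 = sum(abs(relu(relu(x)) - poly_act(poly_act(x))) for x in xs) / len(xs)

print(gap1 < gap2)  # stacking approximation layers grows the error
```

The widening gap is one way to see why deep networks built only from polynomial approximations lose inference accuracy as depth grows.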


Reviews: SHE: A Fast and Accurate Deep Neural Network for Encrypted Data

Neural Information Processing Systems

Main contribution: The paper shows how to implement an accurate homomorphic ReLU and homomorphic max-pooling operation. This is achieved by combining logarithmic quantization, followed by shift-and-add operations, with the basic approach of TFHE (Fast Fully Homomorphic Encryption over the Torus). Further, the authors note that 5-bit representations are sufficient for the weights, but the intermediate results of accumulation need a 16-bit representation to avoid degrading accuracy; they therefore propose mixed-bitwidth accumulators to avoid unnecessary computational cost. With these few key ideas, the authors show how TFHE can support fast matrix multiplications and convolutions, which were previously extremely slow.
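The logarithmic-quantization idea the review describes can be sketched in plaintext (no encryption) as follows. This is a minimal illustration, not the paper's implementation: function names, the exponent-clipping range, and the integer shift are my own assumptions.

```python
# Hedged sketch: log-quantize a weight to a signed power of two, so that
# multiplication by the weight becomes a bit shift plus a sign flip.
# In SHE this replaces expensive homomorphic multiplications; here we
# only show the plaintext arithmetic the trick relies on.
import math

def log_quantize(w, bits=5):
    # Approximate w by sign * 2**e, with e held in a small fixed range
    # (illustrative stand-in for a 5-bit weight representation).
    if w == 0:
        return (0, 0)
    sign = 1 if w > 0 else -1
    e = round(math.log2(abs(w)))
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return (sign, max(lo, min(hi, e)))

def shift_mul(x, sign, e):
    # x * (sign * 2**e) computed with a shift instead of a multiply.
    # Accumulating many such terms is where the review's wider (e.g.
    # 16-bit) accumulator would be needed to avoid overflow.
    if e >= 0:
        return sign * (x << e)
    return sign * (x >> -e)

sign, e = log_quantize(4.2)    # 4.2 is approximated as +2**2
print(shift_mul(3, sign, e))   # 3 * 4 = 12
```

Because shifts and additions map onto cheap homomorphic operations in TFHE, this substitution is what makes the encrypted dot products practical.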


Reviews: SHE: A Fast and Accurate Deep Neural Network for Encrypted Data

Neural Information Processing Systems

Following previous results in this field, they propose the use of Homomorphic Encryption (HE). The authors use a different HE scheme than previous authors did, which allows them to compute ReLUs and other activations that were only approximated in previous studies. They manage to do so while preserving relatively good computation time. This is a significant contribution, as it suggests an alternative to the approaches used before. The authors may wish to include more recent results in Table 4.



SHE: A Fast and Accurate Deep Neural Network for Encrypted Data

Lou, Qian, Jiang, Lei

Neural Information Processing Systems
